
    Estimating packet loss rate in the access through application-level measurements

    End-user monitoring of quality of experience is one of the necessary steps to achieve effective control over network neutrality. Involving end users, however, requires lightweight and user-friendly tools that can easily be run at the application level with limited effort and network resource usage. In this paper, we propose a simple model to estimate the packet loss rate perceived by a connection from round-trip time and TCP goodput samples collected at the application level. The model is derived from the well-known Mathis equation, which predicts the bandwidth of a steady-state TCP connection under random losses and delayed ACKs, and it is evaluated in a testbed environment under a wide range of conditions. Experiments are also run on real access networks. We plan to use the model to analyze the results collected by the "network neutrality bot" (Neubot), a research tool that performs application-level network-performance measurements. The methodology, however, is easily portable and is of interest to virtually any user application that performs large downloads or uploads and needs to estimate access-network quality and its variation.
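    As a point of reference, the Mathis relation that the model builds on, and its inversion, can be sketched as follows. This is a hedged reading of the general formula (with b segments acknowledged per ACK, b = 2 for delayed ACKs), not necessarily the exact formulation used in the paper.

```latex
% Mathis steady-state TCP throughput under a random loss rate p
% (MSS: segment size, RTT: round-trip time, b: segments covered by one ACK; b = 2 with delayed ACKs)
B \;\approx\; \frac{MSS}{RTT}\,\sqrt{\frac{3}{2\,b\,p}}

% Solving for the loss rate given application-level goodput and RTT samples
\hat{p} \;\approx\; \frac{3}{2\,b}\,\left(\frac{MSS}{RTT \cdot B}\right)^{2}
```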

    Is there such a thing as free government data?

    The recently amended European Public Sector Information (PSI) Directive rests on the assumption that government data is a valuable input for the knowledge economy. As a default principle, the directive sets marginal costs as an upper bound for charges for PSI. This article discusses the terms under which the 2013 consultation on the implementation of the PSI Directive addresses the calculation criteria for marginal costs, which are complex to define, especially for internet-based services. It finds that the answer options offered by the consultation indirectly lead respondents to reason in terms of the average incremental cost of allowing re-use, rather than the marginal cost of reproduction, provision and dissemination. Moreover, marginal-cost pricing (or zero pricing) is expected to lead to economically efficient results, whereas aiming to recoup the average incremental cost of allowing re-use may lead to excessive fees.

    Strengthening measurements from the edges: application-level packet loss rate estimation

    Network users know much less than ISPs, Internet exchanges and content providers about what happens inside the network. Consequently, users can neither easily detect network neutrality violations nor readily exercise their market power by knowledgeably switching ISPs. This paper contributes to the ongoing efforts to empower users by proposing two models to estimate -- via application-level measurements -- a key network indicator, i.e., the packet loss rate (PLR) experienced by FTP-like TCP downloads. Controlled, testbed, and large-scale experiments show that the Inverse Mathis model is simpler and more consistent across the whole PLR range, but less accurate than the more advanced Likely Rexmit model for landline connections and moderate PLR.
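    A minimal Python sketch of such an inverse-Mathis estimator is shown below; the function name, the default MSS and the delayed-ACK constant are illustrative assumptions, not the paper's reference implementation.

```python
def inverse_mathis_plr(goodput_bps, rtt_s, mss_bytes=1460, acked_per_ack=2):
    """Estimate the packet loss rate from application-level goodput (bit/s) and RTT (s).

    Inverts the Mathis throughput formula
        B ~= (MSS / RTT) * sqrt(3 / (2 * b * p)),
    where b is the number of segments covered by one ACK (b = 2 with delayed ACKs).
    """
    mss_bits = mss_bytes * 8
    ratio = mss_bits / (rtt_s * goodput_bps)
    return (3.0 / (2.0 * acked_per_ack)) * ratio ** 2

# Example: a 10 Mbit/s download with a 50 ms round-trip time
print(inverse_mathis_plr(goodput_bps=10e6, rtt_s=0.05))  # ~4.1e-4
```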

    Collaborative Open Data versioning: a pragmatic approach using Linked Data

    Most Open Government Data initiatives are centralised and unidirectional (i.e., they release data dumps in CSV or PDF format). Hence, for non-trivial applications, re-users make copies of the government datasets in order to curate their own local copies. This situation is not optimal, as it leads to duplication of effort and reduces the possibility of sharing improvements. To make published open data more useful, several authors have recommended using standard formats and data versioning. Here we focus on publishing versioned linked open data (i.e., in RDF format), because it allows one party to annotate data released independently by another party, thus reducing the need to duplicate entire datasets. After describing a pipeline to publish legacy-database data in RDF format, we argue that RDF is suitable for implementing a scalable feedback channel, and we investigate what steps are needed to run a distributed RDF versioning system in production.
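    A minimal sketch, assuming the rdflib library, of how one party can annotate a dataset published by another party without copying it: the annotation graph only references the publisher's resource URIs. The namespaces, resource names and annotation property below are hypothetical.

```python
from rdflib import Graph, Literal, Namespace

gov = Namespace("http://data.example.gov/resource/")         # publisher's namespace (hypothetical)
notes = Namespace("http://reuser.example.org/annotations#")  # re-user's vocabulary (hypothetical)

annotations = Graph()
annotations.bind("gov", gov)
annotations.bind("notes", notes)

# Attach a correction to a resource the publisher released,
# leaving the original dataset untouched: only its URI is referenced.
annotations.add((
    gov["office/123"],
    notes["correction"],
    Literal("Street address updated on 2014-03-01; see local survey."),
))

print(annotations.serialize(format="turtle"))
```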

    How do universities use social media? An empirical survey of Italian academic institutions

    This work describes how Italian universities use social media, with a focus on Facebook and Twitter. Empirical data about the online features and behaviour of the social media accounts of Italian universities was gathered using several qualitative and quantitative data collection techniques, including automatic data collection, ad-hoc API queries and information obtained from the university personnel managing the accounts. The results of the ‘SocialUniversity’ project show that most Italian universities have active social network accounts; that Facebook is the platform of choice for answering students’ questions, while Twitter serves mostly as an online news channel; that, on average, Italian universities use social media platforms better than the Italian public administration does; and that, within the specific subset of technical universities, a few Italian institutions have an online footprint comparable to some of the top European technical universities (e.g., the Swiss Federal Institute of Technology in Zurich).

    Measuring DASH Streaming Performance from the End Users' Perspective using Neubot

    The popularity of DASH streaming is rapidly increasing, and a number of commercial streaming services are adopting this new standard. While the benefits of building streaming services on top of the HTTP protocol are clear, further work is still necessary to evaluate and enhance system performance from the perspective of the end user. Here we present a novel framework to evaluate the performance of rate-adaptation algorithms for DASH streaming using network measurements collected from more than a thousand Internet clients. The data, which have been made publicly available, are collected by a DASH module built on top of Neubot, an open-source tool for the collection of network measurements. Some examples of possible uses of the collected data are given, ranging from simple analyses and comparisons of download speeds to simulations of alternative adaptation strategies driven by, e.g., the instantaneous available bandwidth.
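    As an illustration of the last point, a toy replay of a purely throughput-based adaptation strategy over a trace of bandwidth samples could look as follows; the bitrate ladder, the safety margin and the trace values are made-up assumptions, not the algorithms or data from the paper.

```python
LADDER_KBPS = [250, 500, 1000, 2000, 4000]  # available representation bitrates (hypothetical)

def throughput_based_adaptation(bandwidth_samples_kbps, margin=0.8):
    """For each measured bandwidth sample, pick the highest representation
    whose bitrate stays below margin * measured bandwidth."""
    choices = []
    for bw in bandwidth_samples_kbps:
        budget = margin * bw
        feasible = [rate for rate in LADDER_KBPS if rate <= budget]
        choices.append(feasible[-1] if feasible else LADDER_KBPS[0])
    return choices

# Example replay over a short, made-up trace of per-segment bandwidth samples (kbit/s)
trace = [3200, 2800, 900, 1500, 5000]
print(throughput_based_adaptation(trace))  # [2000, 2000, 500, 1000, 4000]
```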

    The NeuViz Data Visualization Tool for Visualizing Internet-Measurements Data

    In this paper we present NeuViz, a data processing and visualization architecture for network measurement experiments. NeuViz has been tailored to work on the data produced by Neubot (Net Neutrality Bot), an Internet bot that performs periodic, active network performance tests. We show that NeuViz is an effective tool for navigating Neubot data to identify cases (to be investigated with more specific network tests) in which a protocol appears to be discriminated against. We also suggest how the information provided by the NeuViz Web API can help to automatically detect such cases, raise warnings, or trigger more specific tests.
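    The kind of automatic check hinted at above could, for instance, compare per-protocol download speeds and flag suspicious gaps. The sketch below is a hedged illustration: the input format, the protocols and the 0.7 threshold are assumptions, not NeuViz's actual Web API or decision rule.

```python
from statistics import median

def flag_possible_discrimination(speeds_a_kbps, speeds_b_kbps, ratio_threshold=0.7):
    """Return True if protocol B's median speed is much lower than protocol A's."""
    return median(speeds_b_kbps) < ratio_threshold * median(speeds_a_kbps)

# Example with made-up per-test download speeds (kbit/s)
http_speeds = [5200, 4800, 5100, 4900]
bittorrent_speeds = [1800, 2100, 1700, 2000]
print(flag_possible_discrimination(http_speeds, bittorrent_speeds))  # True
```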